Guideline: Client Satisfaction Evaluation
Capgemini's success is based on the ability of each member of our staff to deliver client satisfaction. In order to maintain its position, Capgemini is committed to delivering services "On Time and At Client Expectations". The Client Satisfaction process evaluates whether Capgemini services meet client expectations.

OTACE (On Time, At Client Expectations) is the mechanism for measuring how well Capgemini is serving its clients on specific contracted work. The Client Satisfaction status of engagements gives directional indications of where engagements are going well and where improvements can be made.

The process covers both responsible and non-responsible engagements.

Note: A non-responsible engagement is also referred to as Staff Augmentation, Professional Services or Outsourcing Services in some units / regions.

Group Requirements

The Group-mandated requirements for the Client Satisfaction process at the engagement level are as follows:

  • Engagements representing the Group-defined target of at least 80% of the unit's or region's YTD revenue should be eligible for OTACE, and should be covered every year.
  • Units or regions must decide and publish the rules so that engagements representing 80% of YTD revenue are eligible for the Client Satisfaction process. For example:
    • Active during the period (including or excluding closed)
    • Responsible and non-responsible engagements
    • Contract types - Fixed Price, clusters of T&M, etc.
    • Minimum size threshold - Total Contract Value (TCV) > 250 K Euros equivalent
    • Minimum timeframe - units / regions may determine a minimum timeframe, for example a lifecycle > 3 months.
  • The evaluation frequency applies to both project-type and service-type engagements.
    • A new engagement is considered due for evaluation within:
      • 3 months of starting for Consulting.
      • 6 months of starting for Build.
      • 6 months or at the end of Transition (whichever is earlier).
      • 6 months of starting for Run.
      • 6 months of starting for Non-responsible.
    • An existing engagement is considered due for evaluation every:
      • 3 months for Consulting.
      • 6 months for Build.
      • 12 months for Run.
      • 6 months for Non-responsible.

Note: For single-person engagements, ensure that conducting OTACE is compliant with applicable legal and regulatory requirements.

OTACE Definition

Capgemini’s OTACE stands for On Time and At Client Expectations. OTACE is not about measuring contractual effectiveness; it enables us to engage with our clients to understand their business needs and expectations, helping to deliver lasting results with tangible benefits. Client satisfaction is measured at the engagement level in terms of OT (On Time delivery) and ACE (At Client Expectations).

OT – ‘On Time’

On Time is the measure of timely delivery of a solution or service by Capgemini engagements. In responsible engagements, OT measures the client's view of the timeliness of our delivery against the agreed timelines for a project or service engagement. The deliverables considered may be, for example: a complete engagement, phases of a project, formal reports in an engagement, application changes, daily reports, SLAs or the uptime of a system.

In non-responsible engagements, On Time is the measure of the timeliness of providing Capgemini resources. This measure is optional for non-responsible engagements.

The measurement reported should be the client's perception of the timeliness of our delivery during the engagement. Engagements should report one of three conditions related to ‘On Time’:

  • ON TIME (Green)
  • LATE (Red)
  • BLANK (No current assessment).


ACE – ‘At Client Expectations’

ACE is the score based on the 5 criteria agreed with the client. This rating is designed to measure the level of client satisfaction with the engagement. ACE should reflect the client’s view of whether an engagement meets expectations within the bounds of the contract, not the view of Capgemini's internal people. Two indicators express the ACE result: a colour code indicating the overall status of ACE, and a calculated figure showing the overall ACE score.

Colour coding by ACE score range:

  • Green (0;176;0): >4
  • Light Green (118;228;0): >=3 to <=4
  • Amber (255;192;0): >=2.7 to <3
  • Red (255;0;0): <2.7

Criteria Guide for ACE Evaluation

To define client expectations, it is critical to understand the criteria on which the client will evaluate an engagement. It is recommended that each engagement identify 5 criteria to assess the ACE Score.

  • Ability to Anticipate (Delivery): Ability to identify hidden needs, notify about events that may occur, analyse them to inform the client, and propose an action plan.
  • Achievement of Commitments (Delivery): Actions are achieved as committed; decisions made are implemented within the committed timeframe and according to agreed outcomes.
  • Adaptive Ability (Team): Capability to change the engagement organization and methods in response to changes in the project / service environment.
  • Appropriate Skills (Team): The team demonstrates technical, functional and business skills appropriate to the engagement needs, the capability to quickly gain new abilities, and listening, analysis and communication skills.
  • Client Focused Attitude (Client): Capgemini understands and takes into account client expectations, delivers with a focus on client needs, understands client constraints and priorities, shows a service attitude and aims for high user satisfaction.
  • Delivering Results (Delivery): Capability to deliver quantifiable results and to organize the team's actions to reach this goal. Capability to monitor changes needed to achieve results.
  • Duty to Advise (Team): Continual improvement or spontaneous proposal of actions or solutions to improve project / service management, the quality of deliverables and the speed of delivery. Ability to raise issues impacting the engagement or company at the right level.
  • Engagement and Budget Management (Delivery): Manage engagement objectives, share engagement visibility with the client (monitoring organization, schedule, reporting, etc.), and mobilize teams on objectives and on budget.
  • Knowledge Transfer (Delivery): Provide the client with the appropriate level of information on the solution or, in Transition engagements, with the appropriate level of information during Transition In and Transition Out and on the solutions developed within the bounds of the contract.
  • Linkage to your Strategic Stakeholders (Client): Understanding and taking into account business and financial stakeholders and their targets, for example the engagement's contribution to the fulfilment of strategic goals within its sphere of influence.
  • Professionalism of Capgemini People (Team): Conduct oneself with a sense of responsibility and professional and personal ethics. Adhere to workplace policies, rules and regulations covering topics such as workplace safety, confidentiality and information security, and complete required training.
  • Quality of Deliverables (Delivery): Content and presentation of deliverables (documents, software, hardware, solutions and services). Services are delivered consistently to client quality requirements, as defined in the key performance indicators and service level agreement. Deliverables meet predefined acceptance criteria.
  • Responsiveness (Delivery): Quickly taking unpredictable events into account; ability to propose and implement the right solution within a timeframe compatible with proper engagement management.
  • Security and Privacy (Delivery): Focus on the safety and security of the client's data, applications and technical environment.
  • Solution Effectiveness (Delivery): Fulfilment of client needs as described in the agreement and key client engagement forums - an optimized and effective solution, appropriate to the business requirements.
  • Support Business Transformation (Delivery): Supporting the client organization in business change within the bounds of the contract.
  • Value-added Insights & Thought Leadership (Team): Providing innovative and value-added 'out-of-the-box' ideas from the business perspective. Ability to lead and mobilize on those ideas.
  • Working Together (Client): Develop a high-quality relationship with client team members; promote mobilization and participation of everyone towards the project or service success.

Note:

  • The criterion ‘Working Together’ includes 'Cross Team Co-operation', 'How we work with your people' and 'Cross organizational integration'.
  • The criterion ‘Appropriate Skills’ includes ‘Attention Given to Capability Development’.

In the case of non-responsible engagements, while all of the above ACE criteria and descriptions are relevant, the following subset is most applicable.

  • Ability to Anticipate (Delivery): Ability to anticipate the need for additional skills or future requirements.
  • Achievement of Commitments (Delivery): Actions are achieved as committed; decisions made are implemented within the committed timeframe and according to agreed outcomes.
  • Appropriate Skills (Team): The team demonstrates technical, functional and business skills appropriate to the engagement needs, the capability to quickly gain new abilities, and listening, analysis and communication skills.
  • Client Focused Attitude (Client): Capgemini understands and takes into account client expectations, constraints and priorities, delivering with a focus on client needs and a service attitude.
  • Professionalism of Capgemini People (Team): Conduct oneself with a sense of responsibility and professional and personal ethics. Adhere to workplace policies, rules and regulations covering topics such as workplace safety, confidentiality and information security, and complete required training.
  • Responsiveness (Delivery): Quickly taking unpredictable events into account; ability to propose and implement the right solution within the agreed timeframe.
  • Security and Privacy (Delivery): Focus on the safety and security of the client's data, applications and technical environment.
  • Working Together (Client): Develop a high-quality relationship with client team members; promote mobilization and participation of everyone towards the project or service success.

When the client defines criteria without referring to the table above, then the Engagement Manager should understand the details and map them as closely as possible to these Group criteria. This is to ensure consistency and to enable evaluation results to be published externally. If the criterion is unique and cannot be mapped to the Group-defined criteria set, then there is a provision to choose one ‘Free Criteria’ option in the Client Satisfaction evaluation (OTACE) form.

Weight Definition

Each criterion selected by the client must be weighted, according to the list below. The same weighting may be applied to several criteria.

  • 5 - Highest Priority
  • 4 - Very Important
  • 3 - Important
  • 2 - Less Important
  • 1 - Minor Importance.

Rating Definition (on the 5-point scale)

During every assessment, the client rates our performance against the 5 defined (or re-defined) criteria. Below is the definition of each rating factor on the 5-point scale.

  • 5 - Excellent
  • 4 - Good
  • 3 - Satisfactory
  • 2 - Disappointing
  • 1 - Poor.

ACE Score Computation and Interpretation

The engagement ACE score is computed by summing each criterion’s rating multiplied by its weight, then dividing by the total weight (21 in the example below) to bring it back to the original scale. The result is the ACE score, reported to a single decimal. Here is an illustrative example.

Criteria agreed with client   Weight   Rate   Weighted Rate
A                             4        4      16
B                             4        4      16
C                             5        3      15
D                             4        3      12
E                             4        4      16

Total Weight = 21
Total Weighted Rate = 75
ACE Score = Total Weighted Rate / Total Weight = 75 / 21 = 3.6
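The computation in the example above can be sketched in Python (the criteria names A-E, weights and ratings are taken from the illustration; the colour bands follow the interpretation table):

```python
# ACE score: sum of (rating x weight) per criterion, divided by the total weight.
criteria = {
    "A": {"weight": 4, "rate": 4},
    "B": {"weight": 4, "rate": 4},
    "C": {"weight": 5, "rate": 3},
    "D": {"weight": 4, "rate": 3},
    "E": {"weight": 4, "rate": 4},
}

def ace_score(criteria):
    total_weight = sum(c["weight"] for c in criteria.values())
    total_weighted_rate = sum(c["weight"] * c["rate"] for c in criteria.values())
    # The guideline reports the score to a single decimal.
    return round(total_weighted_rate / total_weight, 1)

def ace_colour(score):
    # Colour banding from the ACE interpretation table.
    if score > 4:
        return "Green"
    if score >= 3:
        return "Light Green"
    if score >= 2.7:
        return "Amber"
    return "Red"

print(ace_score(criteria))               # 3.6
print(ace_colour(ace_score(criteria)))   # Light Green
```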

The ACE score is interpreted and evaluated to check whether the engagement has met client expectations, and to sustain Capgemini’s continuous efforts to improve the quality of our services.

  • Green (>4): Above client expectations
  • Light Green (>=3 to <=4): At client expectations
  • Amber (>=2.7 to <3): Below client expectations
  • Red (<2.7): Far below client expectations

When multiple client representatives respond for an engagement in a period, calculate the weighted average of all responses to arrive at a single ACE score. Each response can be assigned a CSE weight on a scale of 1 to 5.
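A minimal sketch of that weighted-average combination (the respondent scores and CSE weights below are illustrative, not from the guideline):

```python
# Combine ACE scores from multiple client respondents into a single
# engagement score, using CSE weights on a 1-5 scale.
responses = [
    {"ace": 3.8, "cse_weight": 5},  # e.g. programme sponsor (hypothetical)
    {"ace": 3.0, "cse_weight": 3},
    {"ace": 4.2, "cse_weight": 2},
]

def combined_ace(responses):
    total_weight = sum(r["cse_weight"] for r in responses)
    weighted_sum = sum(r["ace"] * r["cse_weight"] for r in responses)
    # Single decimal, consistent with the ACE score convention.
    return round(weighted_sum / total_weight, 1)

print(combined_ace(responses))  # 3.6
```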

Net Promoter Score℠

NPS (Net Promoter Score) is an indicator of brand loyalty. It is calculated at the unit or Group level based on responses to the question below. NPS indicates whether Capgemini is top of mind for our clients and helps predict future growth.

Q. On a scale of 0-10, how likely are you to recommend Capgemini to a colleague or a peer at another company?

  • 9 - 10: Promoters – the client is happy with our service. They are loyal enthusiasts and may prove to be evangelists for Capgemini.
  • 7 - 8: Passives – the client has an average experience with our service. They have a neutral stance and are unlikely to promote Capgemini to others.
  • 0 - 6: Detractors – the client is not happy with our service. They are unlikely to use Capgemini again and would not recommend Capgemini to others.


Note: Net Promoter Score℠ is a service mark of Bain & Company, Inc., Satmetrix Systems, Inc., and Fred Reichheld.

How to calculate NPS

NPS is not calculated as an average of responses; it is the difference between the percentage of Promoters and the percentage of Detractors: NPS = % Promoters - % Detractors. The scale for NPS runs from -100 to +100.

For example, suppose 20 clients responded to the NPS question: 10 clients responded with 9 or 10 (Promoters), 6 clients responded with 7 or 8 (Passives) and the remaining 4 clients responded with 6 or below (Detractors). To calculate NPS:

% Promoters = 50% (10 clients out of 20)
% Detractors = 20% (4 clients out of 20)
NPS = 50-20 = +30

If there are more Promoters than Detractors, NPS is positive. If there are more Detractors than Promoters, NPS is negative.
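The worked example above can be sketched in Python (the list of ratings mirrors the 10/6/4 split in the example):

```python
# NPS = % Promoters (9-10) minus % Detractors (0-6), on a -100..+100 scale.
def nps(ratings):
    n = len(ratings)
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / n)

# The worked example: 10 Promoters, 6 Passives, 4 Detractors out of 20.
ratings = [9] * 10 + [7] * 6 + [5] * 4
print(nps(ratings))  # 30
```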

Guidance to Engagements

Definition

The below details must be referred to in conjunction with the ‘UPM - Start-up phase’/ ‘USM - Transition Handover-In’ activity under the Client Relationship Management stream.

Definition of 5 criteria in line with Client’s expectations

To enable us to engage with our client and understand their business needs, we ask the client to define the performance criteria at the start of the engagement. During the delivery period, the client may change the criteria or their relative weights, for example following a contractual change, after a score has been provided. Below are the key pointers to consider when defining the criteria.

Understand the expectations from the engagement.

  • Identify the key measures or indicators that are of importance to the business.
  • Understand the criteria by which the client will evaluate how well the engagement performs.
  • Ensure the criteria are within the bounds of the contract and in line with account-level expectations.
  • Map the criteria to the standard criteria set and agree them with the client.

The client representative(s) should select a maximum of 5 criteria to evaluate an engagement.

Identification of Client participant(s)

The assessment is aimed at a small number of key client representative(s) to evaluate executive satisfaction with the outcomes of the engagement. Below are key pointers to consider while identifying client representative(s):

  • Has an understanding of what the engagement is supposed to achieve.
  • Can participate in criteria definition.
  • Will be involved in monitoring the engagement status.

Contact methods (I - Interview, T - Telephone, E - Email or Online- Client Portal interaction)

There are various methods to contact the client for the evaluation, such as an interview, telephone call, email or online client-portal interaction. The Client Satisfaction Evaluation creates an opportunity to establish a further connection with the client, to understand current performance directly and to discuss other business opportunities. It is recommended that the Engagement Manager plans a face-to-face meeting to conduct the evaluation.

Execution

The below details must be referred to in conjunction with the ‘UPM - Execution’ / ‘USM - Service Run’ phase activity under the Client Relationship Management stream.

Conduct as per the plan

All engagements in the regional scope are assessed both on the ‘On Time’ factor and on the ACE score.
Engagement assessments must be run:

  • At the beginning of each engagement: this first assessment must be used to define the ACE criteria with the client.
  • At defined periods during each engagement: the assessment cycle shall follow the Group-recommended frequency. The periodicity depends on various factors (size of the engagement, responsibility, nature of deliverables, etc.). Certain engagements may plan for a quarterly assessment, based on their criticality.
  • At the end of each engagement: this assessment must be run before the completion of the engagement and documented for the final client sign-off.

During the delivery period, if the client asks to change the criteria or their relative weights, this is acceptable only if the change is made at the start of an assessment period.

Deviation Handling (BLACK category)

When the client refuses to participate in the process, the engagement(s) must be classified under the BLACK category. A formal communication of the refusal to participate must be received from the client.

Client Administered Evaluation

If the client declines to provide feedback in the standard OTACE format because they use their own internal format, work with the client team to receive feedback for the engagement(s) for the period. The Engagement Manager shall map the feedback to the OTACE mechanism to ensure coverage in reporting the results. All such engagements within the account must follow a common documented approach for mapping the client’s feedback to the Capgemini format, to establish consistency in scoring interpretation and reporting.

Refer to the Appendix - Client Administered Evaluation, for mapping guidance.

Improvement action plans for a rating or ACE score below 3

If an engagement receives a rating below 3 for any of the 5 criteria, or the overall ACE score is below 3, the Engagement Manager shall identify the causes and perform corrective action (refer to the Corrective Action guideline). It is also recommended that the Engagement Manager performs corrective action for qualitative comments that require attention, and performs an analysis and takes action, as required, where an NPS rating is 6 or below.

Reporting

Each engagement shall report on:

  • 5 criteria agreed with the client.
  • Evaluation results – OT, ACE score and NPS, with an improvement action plan, if any.

Consolidation and Reporting

The metric definitions and reporting below apply to both responsible and non-responsible engagements.

Operational Metrics Definition

All the data and computations below are based on engagement revenue.

  • Eligible Engagements (A): Total revenue should meet the Group-defined target of at least 80% of the BU’s or region’s revenue.
  • Engagements Covered under OTACE (B): Total revenue of engagements where the criteria are defined and at least one OTACE has been received, plus new engagements not yet due for evaluation.
  • Engagements not Covered under OTACE (C): Total revenue of engagements that are due for evaluation but for which no OTACE has been received to date.
  • Engagements Evaluation Refused by Client (D): Total revenue of engagements for which the client refused to participate in Capgemini’s evaluation (BLACK category). If the client has shared evaluation results based on their internal process, count them under (B).
  • Engagements OTACE Compliant (E): Total revenue of engagements where the OTACE has been received within the due period, plus new engagements not yet due for evaluation. A subset of (B).
  • Engagements OTACE Non-compliant (F): Total revenue of engagements where the OTACE has not been received within the due period. A subset of (B).
  • Engagements ACE Score Received for Current Due Period (G): Total revenue of engagements where the ACE score has been received within the due period. A subset of (E).
  • Engagements Not Due for Evaluation (H): Total revenue of new engagements not yet due for evaluation.
  • Engagements ACE Score Far Below Expectations (I): Total revenue of engagements with an ACE score <2.7.
  • Engagements ACE Score Below Expectations (J): Total revenue of engagements with an ACE score >=2.7 to <3.
  • Engagements ACE Score At Expectations (K): Total revenue of engagements with an ACE score >=3 to <=4.
  • Engagements ACE Score Above Expectations (L): Total revenue of engagements with an ACE score >4.
  • Engagements OT (On Time) = ’Yes’ (M): Total revenue of engagements where the client has confirmed on-time delivery of the solution or service.
  • Engagements OT (On Time) = ’No’ (N): Total revenue of engagements where the client has not confirmed on-time delivery of the solution or service.
  • Engagements NPS Promoters (O): Total engagements where clients are Promoters.
  • Engagements NPS Passives (P): Total engagements where clients are Passives.
  • Engagements NPS Detractors (Q): Total engagements where clients are Detractors.

Roll-up Metrics Definition

Coverage, Compliance, NPS, On Time and At Client Expectations are revenue based.
Roll-up metrics are computed as a month-over-month cumulation for the year.

  • OTACE Coverage: % Coverage (on the unit’s or region’s revenue of engagements) = B/A * 100
  • OTACE Compliance: % Compliance = E/A * 100
  • OTACE Client Refused: % Client Refused (BLACK category) = D/A * 100
  • OTACE Compliance including BLACK category: % Compliance including BLACK category = (D + E)/A * 100
  • OTACE Not Due: % of Not Due Engagements = H/E * 100
  • ACE (At Client Expectations):
    • % of compliant engagements where the ACE score is Far Below Expectations = I/G * 100
    • % of compliant engagements where the ACE score is Below Expectations = J/G * 100
    • % of compliant engagements where the ACE score is At Expectations = K/G * 100
    • % of compliant engagements where the ACE score is Above Expectations = L/G * 100
  • OT (On Time):
    • % of compliant engagements where On Time is marked 'Yes' = M/G * 100
    • % of compliant engagements where On Time is marked 'No' = N/G * 100
  • NPS:
    • % of compliant engagements where clients are Promoters = O/G * 100
    • % of compliant engagements where clients are Passives = P/G * 100
    • % of compliant engagements where clients are Detractors = Q/G * 100
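A few of the revenue-based roll-up percentages above can be sketched as follows (the letters follow the operational metric definitions; the revenue figures are made up for illustration):

```python
# Illustrative revenue totals keyed by the operational metric letters (A-Q).
revenue = {"A": 100.0, "B": 90.0, "D": 5.0, "E": 80.0, "G": 70.0,
           "K": 40.0, "M": 60.0}

def pct(numerator, denominator):
    # Percentages reported to one decimal place.
    return round(100 * numerator / denominator, 1)

coverage = pct(revenue["B"], revenue["A"])                          # B/A * 100
compliance = pct(revenue["E"], revenue["A"])                        # E/A * 100
compliance_incl_black = pct(revenue["D"] + revenue["E"], revenue["A"])
ace_at_expectation = pct(revenue["K"], revenue["G"])                # K/G * 100
on_time_yes = pct(revenue["M"], revenue["G"])                       # M/G * 100

print(coverage, compliance, compliance_incl_black)  # 90.0 80.0 85.0
```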

The Client Satisfaction Evaluation report is published periodically, both at unit / region level to Delivery Management and at Group level.

Group level reports are published quarterly.

Appendix

Client Administered Evaluation

When the client refuses to participate in Capgemini’s assessment because the client’s organization conducts the assessment internally and the results are available, below are the possible scenarios and the recommended approach for mapping the client’s feedback to the Capgemini format. The client-administered evaluation score should be entered into e-Monitoring as an OTACE score, with the date.

Scenario 1: Client evaluation has criteria that can be mapped to the Group criteria

Map the criteria to appropriate Group criteria and compute the ACE score. The client's perception of the timeliness of our overall delivery can be mapped to the ‘On Time’ measure.

Scenario 2: Client rating scale is different from Capgemini rating scale

When the client rating scale has fewer or more points than the 5-point scale, understand the rating descriptions and categorize the client ratings to establish a mapping to the ACE score.

5 - Excellent
4 - Good
3 - Satisfactory
2 - Disappointing
1 - Poor.
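As one possible starting point (an assumption, not mandated by the guideline), a client rating on a different scale can be rescaled linearly to the 5-point scale and then sanity-checked against the rating descriptions above:

```python
# Hypothetical helper: linearly rescale a client rating on a scale_min..scale_max
# scale to the Capgemini 1..5 scale. The result should still be reviewed against
# the rating descriptions (Excellent .. Poor) before being recorded.
def to_five_point(rating, scale_max, scale_min=1):
    span = scale_max - scale_min
    return round(1 + 4 * (rating - scale_min) / span, 1)

print(to_five_point(8, scale_max=10))  # a 1..10 client scale: 8 -> 4.1
```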

Scenario 3: Single overall satisfaction score

If the client process captures a single score to evaluate engagement performance, such as an NPS score, map it to the ACE score by referring to the rating definitions.

Scenario 4: Only qualitative statement

If the client process captures only qualitative comments to evaluate engagement performance, map the comments to a score by referring to the rating definitions.